A Survey of Secure Data Deduplication
Authors
Abstract
Similar resources
PerfectDedup: Secure Data Deduplication
With the continuous increase of cloud storage adopters, data deduplication has become a necessity for cloud providers. By storing a unique copy of duplicate data, cloud providers greatly reduce their storage and data transfer costs. Unfortunately, deduplication introduces a number of new security challenges. We propose PerfectDedup, a novel scheme for secure data deduplication, which takes into...
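The "unique copy" idea described above can be sketched as a minimal content-addressed block store: each block is indexed by its hash, so duplicate uploads add only a reference, not a second copy. This is an illustrative sketch, not the PerfectDedup scheme itself; the class and method names are invented for the example.

```python
import hashlib


class DedupStore:
    """Toy content-addressed store: duplicate data is stored only once."""

    def __init__(self):
        self._blocks = {}    # digest -> data (one physical copy per unique block)
        self._refcount = {}  # digest -> number of logical owners

    def put(self, data: bytes) -> str:
        # Identical data always hashes to the same digest, so the second
        # upload of the same block costs a refcount bump, not storage.
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self._blocks:
            self._blocks[digest] = data
        self._refcount[digest] = self._refcount.get(digest, 0) + 1
        return digest

    def get(self, digest: str) -> bytes:
        return self._blocks[digest]

    def unique_blocks(self) -> int:
        return len(self._blocks)
```

Note that this sketch deduplicates only plaintext; as the abstracts below discuss, conventional per-user encryption would make identical files look different and defeat this hashing step.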
Deduplication in Hybrid Cloud with Secure Data
Deduplication, also called the single-instance technique, removes redundant data and stores only the original copy, saving storage space. Protecting sensitive data and controlling access to it are very important nowadays, hence deduplication features have been widely used in cloud storage systems. There was a drawback in previous work wher...
Cloud Based Data Deduplication with Secure Reliability
To eliminate duplicate copies of data, we use the data deduplication process. It is used in cloud storage to minimize memory space and upload bandwidth: only one copy of each file is stored in the cloud and can be shared by many users. The deduplication process helps improve storage utilization, but it also raises a privacy challenge for sensitive data. The aim of this pap...
A Survey On: Secure Data Deduplication on Hybrid Cloud Storage Architecture
Data deduplication is one of the most important data compression techniques, used to remove duplicate copies of repeating data; it is widely used in cloud storage to reduce storage space and save bandwidth. To keep sensitive data confidential while still supporting deduplication, data is encrypted before outsourcing, and the convergent encryption technique h...
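Convergent encryption, mentioned above, derives the encryption key from the plaintext itself, so identical plaintexts always produce identical ciphertexts and remain deduplicable even when encrypted. A minimal stdlib-only sketch of the idea follows; real systems use a proper block cipher such as AES, whereas this example substitutes a toy SHA-256 counter-mode keystream purely for illustration.

```python
import hashlib


def _keystream(key: bytes, length: int) -> bytes:
    # Toy keystream (illustrative only, NOT a real cipher): hash the key
    # with an incrementing counter until enough bytes are produced.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Key = hash(plaintext): identical files yield identical keys and
    # identical ciphertexts, so the server can deduplicate ciphertexts.
    key = hashlib.sha256(plaintext).digest()
    stream = _keystream(key, len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, stream))
    return key, ciphertext


def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    stream = _keystream(key, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, stream))
```

The deterministic key derivation is exactly what makes convergent encryption deduplication-friendly, but it is also its known weakness: an attacker who can guess a low-entropy file can confirm the guess by re-deriving the same ciphertext.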
Estimation of Secure Data Deduplication in Big Data
Big data refers to large and complex data sets. In a big data environment, data is typically unstructured and may contain many duplicate copies of the same data. To manage such complex unstructured data, Hadoop is used: Hadoop is an open-source platform specially designed for big data environments and can handle unstructured data very efficiently compared to t...
Journal
Journal title: International Journal of Computer Applications
Year: 2016
ISSN: 0975-8887
DOI: 10.5120/ijca2016908986